On May 19, 2025, the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act was signed into law. The law makes it a federal crime to knowingly share, or threaten to share, intimate images, including AI-generated intimate images depicting real people (aka “deepfakes”). It also requires platforms to remove such material within 48 hours of notice from a victim and to “make reasonable efforts to identify and remove any known identical copies of such depiction.”

After years of activists fighting for a federal law criminalizing image-based sexual abuse (formerly known as “revenge porn”), the bill is a long-overdue win. However, questions about its enforcement, efficacy, and reliability remain unanswered. This guide aims to clarify how the act will be implemented, what it seeks to achieve, and how it will be enforced.

Here’s a brief overview of the TAKE IT DOWN Act:

Key Provisions:

1. Federal Crime to “Knowingly” Share/Threaten to Share Intimate Images Without Consent

Finally, a federal law recognizes the detrimental effects of non-consensual intimate imagery (NCII), as well as the harm caused by threats to disseminate it. The Take It Down Act seeks to hold perpetrators criminally responsible for knowingly sharing, or threatening to share, intimate images without consent. It also ensures platforms respond to takedown requests in a timely manner.

2. Inclusion of AI-generated Deepfakes

With the rise of AI, assholes, trolls, crazy exes, or really any pervert with an Internet connection can abuse the multitude of websites that offer deepfake-generation services. Perpetrators use these products to create sexual or pornographic images of unsuspecting victims and, without their consent, post them online for anyone to see. There’s usually more than one victim when it comes to deepfakes: the person whose face is superimposed onto an existing explicit photo or video, and the person whose body (and labor, in the case of adult performers) is pirated without their consent.

Victims are often unaware that images or videos depicting them are circulating online until it’s too late, usually discovering them by chance or after being sent content they never took and never knew existed.

The Take It Down Act recognizes the harm of AI-generated images by explicitly including them in the criminalization of NCII.

But Who Is Protected?

The broad language of the bill is intended to cover all forms of AI-generated intimate imagery. However, it remains unclear whether the law will protect every victim, particularly in cases where a victim’s face is not shown. For instance, if one individual’s face is digitally superimposed onto another person’s body, will the Take It Down Act extend protection to both victims: the one whose face is depicted and the one whose body is used without consent?

The bill defines NCII to include realistic, computer-generated pornographic images and videos that depict identifiable, real people. Yet for victims whose bodies are used without consent, especially when their faces are not visible, legal protection under the Act remains uncertain.

3. Online Platforms’ 48-Hour Image Removal

Under this law, online platforms have 48 hours to remove a reported non-consensual intimate image. They must also implement a clear and accessible method for submitting complaints and notify users of the process once it is in place.

Images Covered:

The Take It Down Act narrowly covers sexually explicit images that are shared or created without the victim’s consent. Specifically:

  • Visible private parts of a person you can recognize:
    • Genitals
    • Pubic area
    • Anus
    • Nipples of a post-pubescent female
  • Showing or sharing sexual bodily fluids
  • Clear or realistic depictions of:
    • Sexual intercourse
    • Masturbation
    • Abuse involving pain for sexual pleasure (sadism or masochism)
    • Sex with animals (bestiality)
  • Clear or realistic displays (including fake but very real-looking ones) of:
    • Anus
    • Genitals
    • Pubic area in a sexual way

How You Can Use the Take It Down Act

The act establishes a “notice-and-removal” mechanism that lets an individual request that a platform remove an image. Covered platforms have one year from the bill’s signing (until May 19, 2026) to implement a clear, easily accessible, and efficient method for users to submit takedown requests.

Once a platform receives a valid removal request, it has 48 hours to remove the content. The bill also requires platforms to make “reasonable efforts” to take down duplicates and reposts of the content, though it stops short of defining what that entails.

What Makes a Removal Request “Valid”?

In order for a removal request to be considered valid, it must be in writing and include:

  • A physical or electronic signature of the individual making the request (or their representative);
  • Identification of the content and enough information for the platform to locate it;
  • A brief statement of the individual’s belief that the depiction was not consensual; and
  • The individual’s contact information.

Enforcement

Enforcement power over platforms is given to the Federal Trade Commission (FTC). Covered platforms can face fines of up to $50,000 per incident, while individual perpetrators face fines and imprisonment of up to two years (up to three years when the victim is a minor).

There are growing concerns over whether the FTC will be able to enforce this law effectively, especially given the immense budget cuts under the Trump administration. Is the agency in over its head?

Melania Trump has expressed interest in championing the bill’s implementation, signaling that the White House wants to be heavily involved in its enforcement. This brings both potential benefits and serious concerns: Donald Trump himself told Congress he would use the bill for his own benefit, claiming, “because nobody gets treated worse than I do online, nobody.”

Tell that to the countless victims of image-based sexual abuse, online abuse, harassment, exploitation, and stalking, Mr. President.

Wait… What About Section 230?

Will the Take It Down Act finally remove the shield of Section 230 that tech giants have been hiding behind for all this time?

Unfortunately, no.

Section 230 of the Communications Decency Act is an outdated law from 1996 that protects tech platforms from being held legally responsible for most of the content their users post. This means that even if a platform literally creates the space for non-consensual or abusive content to exist and spread, the platform itself generally can’t be sued by the person harmed simply for hosting it.

So even though these companies enable the spread of harmful material and profit from the engagement it generates, they’ve historically been able to dodge accountability with the claim: “We didn’t post it; someone else did.”

Under the Take It Down Act, the FTC can treat a platform’s failure to comply with the removal requirements as a violation of federal law, opening the door to penalties and enforcement action. However, the Act does not create civil liability: victims still cannot sue a platform for damages if their content is posted without consent and left online. Enforcement means tech companies pay fines to the FTC if they fail to comply, but victims are left without direct recourse against the platforms.

Significance

The Take It Down Act gives us a glimpse into how legislation can help make the Internet a safer place. It offers victims a way to fight back against the spread of non-consensual and AI-generated intimate images and presents a legal pathway to seek justice. It’s a strong step toward reclaiming digital spaces and giving power back to those who’ve had it taken from them.